A Note on Bound for Jensen-Shannon Divergence by Jeffreys
Author
Abstract
We present a lower bound on the Jensen-Shannon divergence by the Jeffreys divergence when pi ≥ qi is satisfied. In Lin's original paper [IEEE Trans. Info. Theory, 37, 145 (1991)], where the divergence was introduced, the upper bound in terms of the Jeffreys divergence was given as one quarter of it. In view of a recent sharper bound reported by Crooks, we discuss upper bounds given by transcendental functions of the Jeffreys divergence, comparing their values for a binary distribution.
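As a quick numerical illustration (not part of the paper itself), the following minimal Python sketch compares the Jensen-Shannon and Jeffreys divergences for binary distributions and checks Lin's quarter bound JS ≤ J/4; the function names and the use of natural logarithms are assumptions made for this example.

```python
import numpy as np

def jensen_shannon(p, q):
    """JS(p, q) = (KL(p||m) + KL(q||m)) / 2 with m = (p + q) / 2, natural logarithm."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a, b):
        # Sum only over entries with a_i > 0 (0 * log 0 is taken as 0).
        mask = a > 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

    return 0.5 * (kl(p, m) + kl(q, m))

def jeffreys(p, q):
    """J(p, q) = KL(p||q) + KL(q||p) = sum_i (p_i - q_i) * log(p_i / q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum((p - q) * np.log(p / q)))

# Check Lin's quarter bound JS <= J / 4 on a few binary (Bernoulli) distributions.
for a, b in [(0.9, 0.1), (0.7, 0.3), (0.55, 0.45)]:
    p, q = [a, 1.0 - a], [b, 1.0 - b]
    js, j = jensen_shannon(p, q), jeffreys(p, q)
    print(f"a={a}, b={b}: JS={js:.4f}  J/4={j / 4:.4f}  bound holds: {js <= j / 4}")
```

For strongly separated binary distributions the gap between JS and J/4 widens, since JS is bounded by ln 2 while the Jeffreys divergence is unbounded; this is the regime where sharper bounds become relevant.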
Similar references
A family of statistical symmetric divergences based on Jensen's inequality
We introduce a novel parametric family of symmetric information-theoretic distances based on Jensen’s inequality for a convex functional generator. In particular, this family unifies the celebrated Jeffreys divergence with the Jensen-Shannon divergence when the Shannon entropy generator is chosen. We then design a generic algorithm to compute the unique centroid defined as the minimum average d...
Inequalities between the Jenson-Shannon and Jeffreys divergences
The last line follows from the previous line by a second application of the same Jensen inequality. Since the J-divergence ranges between zero and positive infinity, whereas the Jensen-Shannon divergence ranges between zero and ln 2 [i.e. 1 bit], this inequality has the correct limits for identical (pi = qi, JS(p;q) = Jeffreys(p;q) = 0) and orthogonal (piqi = 0, JS(p;q) = ln 2, Jeffreys(p;q) = +...
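To make the quoted limits concrete, here is a small self-contained Python check (an illustration, not taken from the cited note); it shows that for nearly orthogonal binary distributions the Jensen-Shannon divergence approaches ln 2 while the Jeffreys divergence grows without bound.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence (natural log); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jeffreys(p, q):
    return kl(p, q) + kl(q, p)

# Nearly orthogonal binary distributions: JS -> ln 2 while Jeffreys grows without bound.
for eps in (1e-2, 1e-4, 1e-6):
    p, q = [1 - eps, eps], [eps, 1 - eps]
    print(f"eps={eps:g}: JS={js(p, q):.6f} (ln 2 = {math.log(2):.6f}), Jeffreys={jeffreys(p, q):.2f}")
```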
Measures of trajectory ensemble disparity in nonequilibrium statistical dynamics
Many interesting divergence measures between conjugate ensembles of nonequilibrium trajectories can be experimentally determined from the work distribution of the process. Herein, we review the statistical and physical significance of several of these measures, in particular the relative entropy (dissipation), Jeffreys divergence (hysteresis), Jensen–Shannon divergence (time asymmetry), Chernoff...
Bounds on Non-Symmetric Divergence Measures in Terms of Symmetric Divergence Measures
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are the Kullback-Leibler [13] relative information and the Jeffreys [12] J-divergence. The Sibson [17] Jensen-Shannon divergence has also found applications in the literature. The author [20] studied new divergence measures based on arithmetic and geometric means....
Jensen divergence based on Fisher's information
The measure of Jensen-Fisher divergence between probability distributions is introduced and its theoretical grounds are set up. This quantity, in contrast to the remaining Jensen divergences, is very sensitive to the fluctuations of the probability distributions because it is controlled by the (local) Fisher information, which is a gradient functional of the distribution. So, it is appropriate and ...